<security> (Or "trap door", "wormhole"). A hole in the
security of a system deliberately left in place by designers
or maintainers. The motivation for such holes is not always
sinister; some operating systems, for example, come out of
the box with privileged accounts intended for use by field
service technicians or the vendor's maintenance programmers.
See also iron box, cracker, worm, logic bomb.
Historically, back doors have often lurked in systems longer
than anyone expected or planned, and a few have become widely
known. The infamous RTM worm of late 1988, for example, used
a back door in the BSD Unix "sendmail(8)" utility.
Ken Thompson's 1983 Turing Award lecture to the ACM revealed
the existence of a back door in early Unix versions that may
have qualified as the most fiendishly clever security hack of
all time. The C compiler contained code that would recognise
when the "login" command was being recompiled and insert some
code recognising a password chosen by Thompson, giving him
entry to the system whether or not an account had been
created for him.
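In outline, that trojaned compile step can be sketched as the
following toy C program. This is a schematic reconstruction,
not Thompson's code: the real hack pattern-matched compiler
internals, whereas here the "compiler" is reduced to a
string-matching pass, and the pattern, payload, and password
are all invented for illustration.

  #include <stdio.h>
  #include <string.h>

  /* Toy "compiler": emit the source unchanged, but when it
     looks like login(1), first splice in a hypothetical
     payload that also accepts a hard-wired password. */
  void compile(const char *src)
  {
      if (strstr(src, "login") != NULL)
          puts("if (strcmp(pw, \"ken\") == 0) ok = 1;  /* back door */");
      puts(src);   /* "translate" the rest of the program */
  }

  int main(void)
  {
      compile("int login(char *user, char *pw) { /* ... */ }");
      return 0;
  }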
Normally such a back door could be removed by removing it from
the source code for the compiler and recompiling the compiler.
But to recompile the compiler, you have to *use* the compiler
- so Thompson also arranged that the compiler would *recognise
when it was compiling a version of itself*, and insert into
the recompiled compiler the code to insert into the recompiled
"login" the code to allow Thompson entry - and, of course, the
code to recognise itself and do the whole thing again the next
time around! And having done this once, he was then able to
recompile the compiler from the original sources; the hack
perpetuated itself invisibly, leaving the back door in place
and active but with no trace in the sources.
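The self-perpetuating step can be sketched by giving the same
toy compiler a second pattern match, again with invented names
and patterns:

  #include <stdio.h>
  #include <string.h>

  /* Toy "compiler" with both trojans: compiling login injects
     the password check; compiling the compiler itself injects
     code that re-creates both injections, so even pristine
     compiler sources yield a trojaned binary. */
  void compile(const char *src)
  {
      if (strstr(src, "login") != NULL)
          puts("/* injected: accept the secret password */");
      if (strstr(src, "void compile(") != NULL)
          puts("/* injected: re-insert both of these trojans */");
      puts(src);   /* translate the rest normally */
  }

  int main(void)
  {
      /* Rebuilding the compiler from clean sources: the binary
         doing the compiling adds the trojans back in. */
      compile("void compile(const char *src) { /* clean */ }");
      return 0;
  }

The crucial property, as described above, is that neither
injection appears in any source file; both live only in the
binary that does the compiling.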
The talk that revealed this truly moby hack was published as
["Reflections on Trusting Trust", "Communications of the ACM
27", 8 (August 1984), pp. 761--763].
[Jargon File]
(1995-04-25)